Improving upon the effective sample size based on Godambe information for block likelihood inference

Authors

Abstract

We consider the effective sample size, based on Godambe information, for block likelihood inference, which is an attractive and computationally feasible alternative to full likelihood inference for large correlated datasets. With reference to a Gaussian random field having constant mean, we explore how the choice of blocks impacts this effective sample size. This is done by introducing a column-wise blocking method that spreads out the spatial points within each block, instead of keeping them close together as the existing row-wise blocking method does. It is seen that column-wise blocking can lead to considerable gains in effective sample size, and hence in estimation efficiency, compared to row-wise blocking, while retaining the computational simplicity. Analytical results in this direction are obtained under the AR(1) model. The insights so obtained facilitate the study of other one-dimensional correlation models, as well as models on the plane, where closed-form expressions are intractable. Simulations provide support for our conclusions.
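As a point of reference for the blocking comparison, the effective sample size for estimating a constant mean under the full likelihood is 1'Σ⁻¹1 (with unit marginal variance), which under AR(1) correlation with parameter ρ has the closed form (n(1−ρ)+2ρ)/(1+ρ). A minimal numerical sketch of this baseline (not the paper's block-likelihood computation; function names are our own):

```python
import numpy as np

def ess_full_likelihood(Sigma):
    """ESS for GLS estimation of a constant mean: 1' Sigma^{-1} 1
    (assuming unit marginal variance)."""
    ones = np.ones(Sigma.shape[0])
    return ones @ np.linalg.solve(Sigma, ones)

def ar1_cov(n, rho):
    """AR(1) correlation matrix: Sigma[i, j] = rho^|i - j|."""
    idx = np.arange(n)
    return rho ** np.abs(idx[:, None] - idx[None, :])

n, rho = 50, 0.6
numeric = ess_full_likelihood(ar1_cov(n, rho))
closed_form = (n * (1 - rho) + 2 * rho) / (1 + rho)
print(numeric, closed_form)  # the two agree up to numerical error
```

Note that the ESS is well below n for positive ρ, which is the deficiency that blocking schemes trade off against computational cost.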


Similar articles

Optimal sample size and censoring scheme in progressively type II censoring based on Fisher information for the Pareto distribution

One of the most common censoring methods is progressive type-II censoring. In this scheme, a total of $n$ units are placed on test, and at the time of failure of each unit, some of the remaining units are randomly removed. This continues until $m$ failure times have been recorded, where $m$ is a pre-determined value, and then the experiment ends. The problem of determining the optimal...
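The censoring scheme described above can be simulated directly. The sketch below (our own illustration, with hypothetical removal numbers R_i) draws Pareto lifetimes by inverse-CDF sampling and applies progressive type-II censoring:

```python
import random

def progressive_type2_censoring(lifetimes, removals, rng=random.Random(0)):
    """Progressive type-II censoring: at each observed failure, withdraw
    R_i of the surviving units at random. Requires sum(R_i) + m == n."""
    assert sum(removals) + len(removals) == len(lifetimes)
    alive = list(lifetimes)
    observed = []
    for r in removals:
        t = min(alive)          # next failure among units still on test
        alive.remove(t)
        observed.append(t)
        for _ in range(r):      # withdraw r surviving units at random
            alive.pop(rng.randrange(len(alive)))
    return observed

# Pareto(alpha, scale 1) lifetimes via inverse CDF: X = U^(-1/alpha)
rng = random.Random(1)
alpha, n = 2.0, 10
lifetimes = [rng.random() ** (-1.0 / alpha) for _ in range(n)]
removals = [1, 0, 2, 0, 2]      # m = 5 observed failures, 5 withdrawals
print(progressive_type2_censoring(lifetimes, removals))
```

The optimal-design question in the article is which vector of removals R_1, ..., R_m maximizes the Fisher information of the resulting sample.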


Effective sample size for importance sampling based on discrepancy measures

The Effective Sample Size (ESS) is an important measure of efficiency of Monte Carlo methods such as Markov Chain Monte Carlo (MCMC) and Importance Sampling (IS) techniques. In the IS context, an approximation ÊSS of the theoretical ESS definition is widely applied, involving the inverse of the sum of the squares of the normalized importance weights. This formula, ÊSS, has become an essential p...
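The approximation referred to above is the standard weight-based formula: with normalized importance weights w̄_i, the estimate is ESS-hat = 1 / Σᵢ w̄_i². A minimal sketch (function name is our own):

```python
import numpy as np

def ess_importance_sampling(weights):
    """ESS-hat = 1 / sum(wbar_i^2), where wbar are the normalized
    importance weights. Ranges from 1 (degenerate) to len(weights)."""
    w = np.asarray(weights, dtype=float)
    wbar = w / w.sum()
    return 1.0 / np.sum(wbar ** 2)

print(ess_importance_sampling([1, 1, 1, 1]))   # equal weights -> 4.0
print(ess_importance_sampling([1, 0, 0, 0]))   # one dominant weight -> 1.0
```

The discrepancy-measure perspective of the article asks how well this particular formula tracks the theoretical ESS, and considers alternatives.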


Optimum Block Size in Separate Block Bootstrap to Estimate the Variance of Sample Mean for Lattice Data

The statistical analysis of spatial data is usually done under Gaussian assumption for the underlying random field model. When this assumption is not satisfied, block bootstrap methods can be used to analyze spatial data. One of the crucial problems in this setting is specifying the block sizes. In this paper, we present asymptotic optimal block size for separate block bootstrap to estimate the...
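The separate (non-overlapping) block bootstrap mentioned above is easy to illustrate in one dimension, even though the article's setting is lattice data; the sketch below (our own, with an AR(1)-style series as correlated input) resamples whole blocks to estimate the variance of the sample mean:

```python
import numpy as np

def separate_block_bootstrap_var(x, block_len, n_boot,
                                 rng=np.random.default_rng(0)):
    """Estimate Var(sample mean) by resampling non-overlapping blocks
    with replacement; any trailing partial block is discarded."""
    x = np.asarray(x, dtype=float)
    n_blocks = len(x) // block_len
    blocks = x[: n_blocks * block_len].reshape(n_blocks, block_len)
    means = np.empty(n_boot)
    for b in range(n_boot):
        pick = rng.integers(0, n_blocks, size=n_blocks)
        means[b] = blocks[pick].mean()
    return means.var(ddof=1)

# Correlated series: nearby values move together, so an i.i.d. bootstrap
# would understate the variance of the mean.
rng = np.random.default_rng(1)
e = rng.normal(size=500)
x = np.empty(500)
x[0] = e[0]
for t in range(1, 500):
    x[t] = 0.7 * x[t - 1] + e[t]
print(separate_block_bootstrap_var(x, block_len=20, n_boot=500))
```

The choice of `block_len` is exactly the tuning problem the article addresses: blocks must be long enough to preserve the dependence, but short enough to leave many blocks to resample.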


Phylogenetic effective sample size.

In this paper I address the question: how large is a phylogenetic sample? I propose a definition of a phylogenetic effective sample size for Brownian motion and Ornstein-Uhlenbeck processes: the regression effective sample size. I discuss how mutual information can be used to define an effective sample size in the non-normal process case and compare these two definitions to an already present con...


Improving upon probability weighting for household size

By comparing data from national telephone polls to Census figures on household size (number of adults in household), we find large differences between population and sample, even after weighting respondents proportional to household size. This presumably occurs because larger households are easier to reach and more likely to respond to the survey. If a user wishes to weight on household size, we rec...
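The weighting step mentioned above can be made concrete: in a phone poll that reaches one adult per household, weighting each respondent proportional to household size converts the household-level sample into an adult-level (size-biased) estimate. A minimal sketch with hypothetical respondent data:

```python
# Hypothetical respondent data: number of adults in each sampled household.
sizes = [1, 2, 2, 3, 1, 2, 4, 2, 1, 3]

# Unweighted mean: average household size across sampled households.
unweighted = sum(sizes) / len(sizes)

# Weighting each respondent proportional to household size gives the
# mean household size experienced by a randomly chosen adult
# (the size-biased mean, E[S^2] / E[S]).
weighted = sum(s * s for s in sizes) / sum(sizes)
print(unweighted, weighted)
```

The size-biased mean is always at least as large as the unweighted mean; the article's point is that even this correction leaves discrepancies, because response rates also vary with household size.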



Journal

Journal title: Computational Statistics

Year: 2023

ISSN: 0943-4062, 1613-9658

DOI: https://doi.org/10.1007/s00180-023-01328-6